# 6 billion parameters
## GPT-J 6B

- License: Apache-2.0
- Description: GPT-J 6B is a 6-billion-parameter autoregressive language model trained with the Mesh Transformer JAX framework; it uses the same tokenizer as GPT-2/GPT-3.
- Category: Large Language Model
- Language: English
- Developer: EleutherAI
- Downloads: 297.31k
- Likes: 1,493
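As a minimal sketch of how a model like this is typically used, the snippet below loads GPT-J 6B through the Hugging Face `transformers` library under the Hub id `EleutherAI/gpt-j-6b` (an assumption based on the developer name above; verify the exact id before running). The full-precision checkpoint is roughly 24 GB, so the actual load and generation are guarded behind `__main__`; the `generation_config` helper and its sampling values are illustrative, not part of the original page.

```python
def generation_config(max_new_tokens: int = 50, temperature: float = 0.9) -> dict:
    """Illustrative keyword arguments for model.generate()."""
    return {
        "do_sample": True,
        "temperature": temperature,
        "max_new_tokens": max_new_tokens,
    }


if __name__ == "__main__":
    # Heavy part: downloads ~24 GB of weights on first run.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6b")
    model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6b")

    inputs = tokenizer("The meaning of life is", return_tensors="pt")
    outputs = model.generate(**inputs, **generation_config())
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because GPT-J shares the GPT-2/GPT-3 tokenizer, prompts tokenized for GPT-2 map to the same token ids here, which makes prompt-length budgeting transferable between the two model families.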
© 2025 AIbase